Stochastic ADMM for Nonsmooth Optimization

Authors

  • Hua Ouyang
  • Niao He
  • Alexander G. Gray
Abstract

The Alternating Direction Method of Multipliers (ADMM) has gained a lot of attention due to the demands of large-scale machine learning.

  • Classic (1970s) and flexible; survey paper: (Boyd et al., 2011)
  • Applications: compressed sensing (Yang & Zhang, 2011), image restoration (Goldstein & Osher, 2009), video processing and matrix completion (Goldfarb et al., 2010)
  • Recent variations: linearized (Goldfarb et al., 2010; Zhang et al., 2011; Yang & Yuan, 2012), accelerated (Goldfarb et al., 2010), and online (Wang & Banerjee, 2012) ADMM
  • Global convergence proved in the 1980s (Gabay, 1983; Eckstein & Bertsekas, 1992)
  • Recent progress on the rate of convergence: O(1/T) for convex functions (He, 2011)
  • We propose a linearized stochastic ADMM algorithm; it applies to a more general class of convex and nonsmooth objective functions, beyond the smooth and separable least-squares loss used in the lasso (see the sketch after this list).
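
As a hedged illustration of the proposed setting, the sketch below runs a one-sample linearized stochastic ADMM iteration on a lasso-style instance, min_x E_i[0.5*(a_i^T x - b_i)^2] + lam*||z||_1 subject to x - z = 0. The consensus splitting, the 1/sqrt(k) step-size schedule, and all names here are illustrative assumptions, not the paper's exact algorithm.

    import numpy as np

    def stochastic_admm_lasso(A, b, lam, rho=1.0, T=1000, seed=0):
        # Illustrative sketch only: assumed splitting x - z = 0 and assumed
        # step sizes eta_k = 1/sqrt(k); not the paper's exact method.
        rng = np.random.default_rng(seed)
        n, d = A.shape
        x, z, u = np.zeros(d), np.zeros(d), np.zeros(d)  # u: scaled dual variable
        for k in range(1, T + 1):
            i = rng.integers(n)
            g = (A[i] @ x - b[i]) * A[i]   # stochastic gradient of 0.5*(a_i^T x - b_i)^2
            eta = 1.0 / np.sqrt(k)         # decaying step size (assumed schedule)
            # Linearized x-update, the closed-form minimizer of
            #   <g, x> + (rho/2)*||x - z + u||^2 + ||x - x_k||^2 / (2*eta)
            x = (x / eta + rho * (z - u) - g) / (1.0 / eta + rho)
            v = x + u
            z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)  # soft-thresholding prox
            u = u + x - z                  # dual (multiplier) update
        return z

The sparse iterate z is returned for simplicity; stochastic analyses usually bound the objective at an averaged iterate, which is omitted here for brevity.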

Related articles

Mini-Batch Stochastic ADMMs for Nonconvex Nonsmooth Optimization

In this paper, we study mini-batch stochastic ADMMs (alternating direction methods of multipliers) for nonconvex nonsmooth optimization. We prove that, given an appropriate mini-batch size, the mini-batch stochastic ADMM without a variance reduction (VR) technique is convergent and reaches a convergence rate of O(1/T) in obtaining a stationary point of the nonconvex problem, where T de...
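
Since this snippet contrasts plain mini-batch stochastic gradients with variance reduction (VR), here is a sketch of the SVRG-style variance-reduced estimator commonly substituted into the stochastic x-update; the least-squares loss and all names are assumptions for illustration, not this paper's construction.

    import numpy as np

    def vr_minibatch_grad(A, b, x, x_snap, full_grad_snap, idx):
        # SVRG-style estimator: g = grad_B(x) - grad_B(x_snap) + full_grad(x_snap),
        # where B = idx is the mini-batch and x_snap is a periodic snapshot point.
        g_cur = A[idx].T @ (A[idx] @ x - b[idx]) / len(idx)
        g_snap = A[idx].T @ (A[idx] @ x_snap - b[idx]) / len(idx)
        return g_cur - g_snap + full_grad_snap

The estimator remains unbiased while its variance shrinks as x approaches the snapshot, which is what lets VR variants improve on the plain mini-batch rate.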


Accelerated Stochastic Gradient Method for Composite Regularization

Regularized risk minimization often involves nonsmooth optimization. This can be particularly challenging when the regularizer is a sum of simpler regularizers, as in the overlapping group lasso. Very recently, this has been alleviated by using the proximal average, in which an implicitly nonsmooth function is employed to approximate the composite regularizer. In this paper, we propose a novel extens...
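
The proximal-average trick mentioned above can be summarized in one line: for a composite regularizer R = (1/K) * sum_k r_k whose exact proximal step is hard, average the easy proximal steps of the components instead. A minimal sketch, with an l1 and a squared-l2 component chosen purely as examples:

    import numpy as np

    def prox_average(x, component_proxes, eta):
        # Approximates prox_{eta*R}(x) for R = (1/K) * sum_k r_k by averaging the
        # component proxes; the approximation error is O(eta) under Lipschitz
        # assumptions (a known property of the proximal average, not a claim
        # quoted from this paper).
        return sum(p(x, eta) for p in component_proxes) / len(component_proxes)

    prox_l1 = lambda v, eta: np.sign(v) * np.maximum(np.abs(v) - eta, 0.0)  # prox of eta*||.||_1
    prox_l2sq = lambda v, eta: v / (1.0 + eta)                              # prox of (eta/2)*||.||^2

    x_hat = prox_average(np.array([1.5, -0.2, 0.7]), [prox_l1, prox_l2sq], eta=0.5)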


Stochastic Alternating Direction Method of Multipliers

The Alternating Direction Method of Multipliers (ADMM) has received lots of attention recently due to the tremendous demand from large-scale and data-distributed machine learning applications. In this paper, we present a stochastic setting for optimization problems with non-smooth composite objective functions. To solve this problem, we propose a stochastic ADMM algorithm. Our algorithm applies...
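
A plausible formalization of the composite stochastic setting described here, in standard ADMM notation (the symbols below are assumed for illustration, not quoted from the paper):

    \min_{x \in \mathcal{X},\; y \in \mathcal{Y}} \;
      \mathbb{E}_{\xi}\!\big[\theta_1(x, \xi)\big] + \theta_2(y)
    \quad \text{s.t.} \quad Ax + By = b,

with the augmented Lagrangian that drives the alternating updates:

    L_\rho(x, y, \lambda) \;=\; \mathbb{E}_{\xi}\!\big[\theta_1(x, \xi)\big] + \theta_2(y)
      \;-\; \langle \lambda,\, Ax + By - b \rangle
      \;+\; \tfrac{\rho}{2}\,\lVert Ax + By - b \rVert^2 .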


Towards an optimal stochastic alternating direction method of multipliers

We study regularized stochastic convex optimization subject to linear equality constraints. This class of problems was recently also studied by Ouyang et al. (2013) and Suzuki (2013); both introduced similar stochastic alternating direction method of multipliers (SADMM) algorithms. However, the analysis of both papers led to suboptimal convergence rates. This paper presents two new SADMM method...


Global Convergence of ADMM in Nonconvex Nonsmooth Optimization

In this paper, we analyze the convergence of the alternating direction method of multipliers (ADMM) for minimizing a nonconvex and possibly nonsmooth objective function, φ(x_0, …, x_p, y), subject to coupled linear equality constraints. Our ADMM updates each of the primal variables x_0, …, x_p, y, followed by updating the dual variable. We separate the variable y from the x_i's as it has a spe...
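
The update order described in this snippet is a Gauss-Seidel sweep over the augmented Lagrangian L_ρ. Writing the coupled constraint as Σ_i A_i x_i + B y = b (an assumed standard form, not the paper's exact notation), one pass of the method reads:

    x_i^{k+1} \in \arg\min_{x_i} \;
      L_\rho\big(x_0^{k+1}, \ldots, x_{i-1}^{k+1},\, x_i,\, x_{i+1}^{k}, \ldots, x_p^{k},\, y^k,\, w^k\big),
      \qquad i = 0, \ldots, p,

    y^{k+1} \in \arg\min_{y} \; L_\rho\big(x_0^{k+1}, \ldots, x_p^{k+1},\, y,\, w^k\big),

    w^{k+1} \;=\; w^k + \rho \Big( \textstyle\sum_{i=0}^{p} A_i x_i^{k+1} + B y^{k+1} - b \Big).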



Journal:
  • CoRR

Volume: abs/1211.0632  Issue: -

Pages: -

Publication date: 2012